Generator-based Graph Generation via Heat Diffusion
Stephenson, Anthony, Gallagher, Ian, Nemeth, Christopher
Graph generative modelling has become an essential task due to the wide range of applications in chemistry, biology, social networks, and knowledge representation. In this work, we propose a novel framework for generating graphs by adapting the Generator Matching (arXiv:2410.20587) paradigm to graph-structured data. We leverage the graph Laplacian and its associated heat kernel to define a continuous-time diffusion on each graph. The Laplacian serves as the infinitesimal generator of this diffusion, and its heat kernel provides a family of conditional perturbations of the initial graph. A neural network is trained to match this generator by minimising a Bregman divergence between the true generator and a learnable surrogate. Once trained, the surrogate generator is used to simulate a time-reversed diffusion process to sample new graph structures. Our framework unifies and generalises existing diffusion-based graph generative models, injecting domain-specific inductive bias via the Laplacian, while retaining the flexibility of neural approximators. Experimental studies demonstrate that our approach captures structural properties of real and synthetic graphs effectively.
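The Laplacian heat kernel the abstract builds on can be computed directly for a small graph. This is an illustrative sketch (not the paper's code): the combinatorial Laplacian L = D - A generates a diffusion whose transition kernel is the matrix exponential exp(-tL); the toy path graph and variable names here are assumptions.

```python
import numpy as np
from scipy.linalg import expm

# Adjacency matrix of a small path graph 0-1-2-3 (toy example).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # combinatorial graph Laplacian

def heat_kernel(L, t):
    """Heat kernel K_t = exp(-t L): the transition kernel of the
    diffusion whose infinitesimal generator is -L."""
    return expm(-t * L)

K = heat_kernel(L, t=1.0)

# Because L has zero row sums, each row of K is a probability
# distribution: heat diffusing from one node across the graph.
print(K.sum(axis=1))
```

As t grows, K approaches the uniform distribution on each connected component, which is what makes the kernel a natural family of progressively stronger perturbations of the initial graph.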
Conditional Normalizing Flows for Forward and Backward Joint State and Parameter Estimation
Lagunowich, Luke S., Tong, Guoxiang Grayson, Schiavazzi, Daniele E.
Traditional filtering algorithms for state estimation -- such as classical Kalman filtering, unscented Kalman filtering, and particle filters -- show performance degradation when applied to nonlinear systems whose uncertainty follows arbitrary non-Gaussian, and potentially multi-modal, distributions. This study reviews recent approaches to state estimation via nonlinear filtering based on conditional normalizing flows, where the conditional embedding is generated by standard MLP architectures, transformers, or selective state-space models (like Mamba-SSM). In addition, we test the effectiveness of an optimal-transport-inspired kinetic loss term in mitigating overparameterization in flows consisting of a large collection of transformations. We investigate the performance of these approaches on applications relevant to autonomous driving and patient population dynamics, paying special attention to how they handle time inversion and chained predictions. Finally, we assess the performance of various conditioning strategies for an application to real-world COVID-19 joint SIR system forecasting and parameter estimation.
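The conditional-flow idea underlying this abstract can be sketched in a few lines: an affine flow whose shift and log-scale are produced by a small network acting on the conditioning input, with the exact log-density given by the change-of-variables formula. This is a minimal illustration under assumed names and a random-weight MLP, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny random-weight MLP conditioner: c -> (mu(c), log_scale(c)).
W1 = rng.normal(size=(8, 3)); b1 = np.zeros(8)
W2 = rng.normal(size=(2, 8)); b2 = np.zeros(2)

def conditioner(c):
    h = np.tanh(W1 @ c + b1)
    mu, log_scale = W2 @ h + b2
    return mu, log_scale

def forward(z, c):
    """Conditional affine flow: z -> x = mu(c) + exp(s(c)) * z."""
    mu, s = conditioner(c)
    return mu + np.exp(s) * z

def log_prob(x, c):
    """Change of variables: log p(x|c) = log N(z; 0, 1) - s(c)."""
    mu, s = conditioner(c)
    z = (x - mu) * np.exp(-s)
    return -0.5 * (z**2 + np.log(2 * np.pi)) - s

c = np.array([0.5, -1.0, 2.0])   # conditioning input (e.g. observations)
z = rng.normal()                 # base sample
x = forward(z, c)
print(log_prob(x, c))
```

In practice the single affine layer is replaced by a stack of coupling or spline layers, and the conditioner by a transformer or state-space encoder, but the invertibility and log-density bookkeeping are the same.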
Learning Optimal Flows for Non-Equilibrium Importance Sampling
Many applications in computational sciences and statistical inference require the computation of expectations with respect to complex high-dimensional distributions with unknown normalization constants, as well as the estimation of these constants. Here we develop a method to perform these calculations based on generating samples from a simple base distribution, transporting them by the flow generated by a velocity field, and performing averages along these flowlines. This non-equilibrium importance sampling (NEIS) strategy is straightforward to implement and can be used for calculations with arbitrary target distributions. On the theory side, we discuss how to tailor the velocity field to the target and establish general conditions under which the proposed estimator is a perfect estimator with zero variance. We also draw connections between NEIS and approaches based on mapping a base distribution onto a target via a transport map. On the computational side, we show how to use deep learning to represent the velocity field by a neural network and train it towards the zero-variance optimum. These results are illustrated numerically on benchmark examples (with dimension up to $10$), where after training the velocity field, the variance of the NEIS estimator is reduced by up to $6$ orders of magnitude relative to that of a vanilla estimator. We also compare the performance of NEIS with that of Neal's annealed importance sampling (AIS).
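The transport-map special case the abstract relates NEIS to is easy to demonstrate in one dimension: pushing base samples through a map matched to the target turns the importance weights into a constant, giving a zero-variance estimate of the normalization constant. The Gaussian target and the map below are illustrative assumptions, not the paper's benchmarks.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unnormalised Gaussian target rho(x) = exp(-(x - m)^2 / (2 s^2));
# its true normalisation constant is Z = s * sqrt(2 * pi).
m, s = 2.0, 0.5
rho = lambda x: np.exp(-(x - m) ** 2 / (2 * s ** 2))

z = rng.normal(size=10_000)                       # base samples ~ N(0, 1)
base_pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)

# Vanilla importance sampling: propose directly from the base.
w_vanilla = rho(z) / base_pdf
Z_vanilla = np.mean(w_vanilla)

# Transported sampling: x = T(z) = m + s z. The pushforward density is
# base_pdf / |T'(z)|, so the weight rho(T(z)) * s / base_pdf(z) is
# constant here -- the zero-variance situation the abstract describes.
x = m + s * z
w_transport = rho(x) * s / base_pdf
Z_transport = np.mean(w_transport)

print(Z_transport, np.std(w_transport))
```

NEIS generalizes this picture by averaging along the entire flowline of a velocity field rather than at a single transported point, which is what allows a learned field to approach the zero-variance optimum for targets with no closed-form map.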
A General Approach to Visualizing Uncertainty in Statistical Graphics
Petek, Bernarda, Nabergoj, David, Štrumbelj, Erik
We present a general approach to visualizing uncertainty in static 2-D statistical graphics. If we treat a visualization as a function of its underlying quantities, uncertainty in those quantities induces a distribution over images. We show how to aggregate these images into a single visualization that represents the uncertainty. The approach can be viewed as a generalization of sample-based approaches that use overlay. Notably, standard representations, such as confidence intervals and bands, emerge with their usual coverage guarantees without being explicitly quantified or visualized. As a proof of concept, we implement our approach in the IID setting using resampling, provided as an open-source Python library. Because the approach operates directly on images, the user needs only to supply the data and the code for visualizing the quantities of interest without uncertainty. Through several examples, we show how both familiar and novel forms of uncertainty visualization can be created. The implementation is not only a practical validation of the underlying theory but also an immediately usable tool that can complement existing uncertainty-visualization libraries.
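The image-aggregation idea above can be sketched without any plotting library: render each bootstrap replicate of a statistic to a tiny raster, then average the rasters pixel-wise. The one-row "image", the mean statistic, and all names here are illustrative assumptions, not the library's API.

```python
import numpy as np

rng = np.random.default_rng(2)

data = rng.normal(loc=0.3, scale=1.0, size=200)
n_pixels, lo, hi = 50, -1.0, 1.0

def render_mean(sample):
    """A one-row 'image' with a single lit pixel at the sample mean."""
    img = np.zeros(n_pixels)
    idx = int((np.mean(sample) - lo) / (hi - lo) * (n_pixels - 1))
    img[np.clip(idx, 0, n_pixels - 1)] = 1.0
    return img

# IID setting via resampling: bootstrap, render each replicate,
# then aggregate the resulting images by pixel-wise averaging.
images = [render_mean(rng.choice(data, size=data.size)) for _ in range(500)]
density_image = np.mean(images, axis=0)

# High-intensity pixels form the region where the mean plausibly lies:
# a confidence band emerges from aggregation without being computed.
print(density_image.argmax())
```

The same recipe applies unchanged to full 2-D plots: because aggregation happens at the pixel level, the renderer can be any visualization function, which is the generality the abstract claims.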
Amortized Inference of Multi-Modal Posteriors using Likelihood-Weighted Normalizing Flows
Across diverse domains--from complex systems and finance to high-energy physics and astrophysics--scientific inquiry often relies on deriving theoretical parameters from observational data [1]. At the core of this challenge lies the inverse problem: inferring the posterior distribution of theoretical parameters given a set of observables [2]. Traditional approaches for posterior estimation rely on sampling algorithms such as Markov Chain Monte Carlo (MCMC) [3, 4] and Nested Sampling (NS) [5]. In astrophysics and cosmology, implementations like emcee [6] and dynesty [7] have become standard tools. While these frameworks are statistically robust, they suffer significantly from the curse of dimensionality. In real-world scenarios, where the parameter space is high-dimensional and the likelihood function relies on computationally expensive simulators (e.g., in particle physics phenomenology [8]), convergence can take weeks or even months. Recent advances in machine learning have introduced Normalizing Flows (NFs) as a powerful alternative for probabilistic modelling [9, 10]. By learning a bijective mapping between a simple base distribution (e.g., a Gaussian) and the complex target distribution, NFs allow for exact density estimation and efficient sampling [11] from the target distribution. Modern architectures, such as RealNVP [12] and Neural Spline Flows [13], offer enough expressivity to model highly complex distributions.
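The exact density estimation that normalizing flows provide rests on the change-of-variables identity: for a bijection f with x = f(z), p_x(x) = p_z(f^{-1}(x)) |det df/dz|^{-1}. A minimal sketch with a fixed element-wise bijection (an assumption for illustration, not a RealNVP or spline flow):

```python
import numpy as np

def f(z):
    """Fixed element-wise bijection from R^d onto (-1, 1)^d."""
    return np.tanh(z)

def log_det_jac(z):
    # tanh has a diagonal Jacobian with entries 1 - tanh(z)^2.
    return np.sum(np.log(1.0 - np.tanh(z) ** 2))

def log_px(x):
    """Exact log-density of x = f(z) with a standard normal base."""
    z = np.arctanh(x)                                   # inverse map
    log_pz = -0.5 * np.sum(z ** 2 + np.log(2 * np.pi))  # base density
    return log_pz - log_det_jac(z)

x = np.array([0.1, -0.4])
print(log_px(x))
```

Trainable flows such as RealNVP and Neural Spline Flows replace the fixed map with parameterized invertible layers whose Jacobian determinants remain cheap to evaluate, which is what makes both exact density evaluation and efficient posterior sampling tractable.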